Efficiently batch inserting or updating large datasets in IndexedDB (especially 10,000+ records) requires careful use of transactions, chunking, and avoiding memory bottlenecks. Here's how you can do it efficiently and safely:
1. Use Bulk APIs (bulkAdd, bulkPut) with Dexie.js
If you're using Dexie.js, it provides bulkAdd and bulkPut, which are highly optimized.
bulkAdd() → inserts new items (fails on existing keys)
bulkPut() → inserts or updates (upsert)
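A minimal sketch, assuming a Dexie database named AppDatabase with an items table keyed by id (the schema below is illustrative):

```javascript
import Dexie from 'dexie';

// Define the database and an "items" table with "id" as the primary key.
const db = new Dexie('AppDatabase');
db.version(1).stores({
  items: 'id, name' // primary key "id", secondary index on "name"
});

// Insert or update (upsert) a large array of records in one optimized call.
async function saveItems(records) {
  await db.items.bulkPut(records); // bulkAdd(records) would reject on existing keys
}
```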
2. Chunk the Data (to avoid memory/transaction limits)
IndexedDB transactions have limits (browser- and platform-dependent). A safe chunk size is 500–1000 records per transaction.
⚠️ Without chunking, large sets (e.g. 100k records) may crash or freeze the browser.
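A minimal sketch of chunked writes, assuming the db.items table from section 1; the chunk size of 1000 is just one value in the suggested 500–1000 range:

```javascript
// Write records in fixed-size chunks so no single transaction grows too large.
async function bulkPutInChunks(records, chunkSize = 1000) {
  for (let i = 0; i < records.length; i += chunkSize) {
    const chunk = records.slice(i, i + chunkSize);
    await db.items.bulkPut(chunk); // each call runs in its own transaction
  }
}
```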
3. Use Explicit Transactions (Vanilla IndexedDB)
If not using Dexie:
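A minimal sketch with vanilla IndexedDB, assuming an already-open IDBDatabase with an object store named items:

```javascript
// Queue all writes on one readwrite transaction and resolve when it completes.
function bulkPutVanilla(db, records) {
  return new Promise((resolve, reject) => {
    const tx = db.transaction('items', 'readwrite');
    const store = tx.objectStore('items');

    for (const record of records) {
      store.put(record); // no need to listen to each request's success event
    }

    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
    tx.onabort = () => reject(tx.error);
  });
}
```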
But this is slower and harder to manage than Dexie.
4. Update Efficiently Using modify() in Dexie
modify() is efficient for bulk in-place updates of records that match a predicate condition.
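A minimal sketch, assuming the items table also has an indexed category field and a status field (both hypothetical):

```javascript
// Update every matching record in place without loading the whole set first.
async function archiveCategory(category) {
  const count = await db.items
    .where('category').equals(category)
    .modify({ status: 'archived' });
  return count; // number of records modified
}
```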
5. Avoid JSON.parse/stringify for Huge Sets
Handling 100K records as a full JSON array in memory is expensive. Process the data in streams or with generators if the source is large (e.g., a file or a remote API).
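A minimal sketch using an async generator, assuming a hypothetical paginated endpoint /api/items?page=N that returns a JSON array per page, so the full dataset never sits in memory at once:

```javascript
// Yield one page of records at a time instead of materializing the whole dataset.
async function* fetchItemsInPages(pageSize = 1000) {
  let page = 0;
  while (true) {
    const res = await fetch(`/api/items?page=${page}&size=${pageSize}`);
    const batch = await res.json();
    if (batch.length === 0) break;
    yield batch;
    page += 1;
  }
}

// Write each page as it arrives.
async function importFromApi() {
  for await (const batch of fetchItemsInPages()) {
    await db.items.bulkPut(batch);
  }
}
```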
6. Wrap in Transactions for Speed and Atomicity
Dexie:
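A minimal sketch of an explicit Dexie transaction, assuming the items table from section 1; if any step throws, the whole transaction rolls back:

```javascript
// Replace the table's contents atomically: either both steps succeed or neither does.
async function replaceAllItems(records) {
  await db.transaction('rw', db.items, async () => {
    await db.items.clear();          // remove stale rows
    await db.items.bulkPut(records); // write the new dataset
  });
}
```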
Vanilla:
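The vanilla pattern is the same explicit transaction shown in section 3; a condensed sketch, assuming an open IDBDatabase db and a records array:

```javascript
// One readwrite transaction for all writes; completion and errors surface on the transaction.
const tx = db.transaction('items', 'readwrite');
const store = tx.objectStore('items');
records.forEach((record) => store.put(record));
tx.oncomplete = () => console.log('All records written');
tx.onerror = () => console.error(tx.error);
```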
Transactions batch writes: many put calls inside one explicit transaction are far faster than giving each put its own implicit transaction.
7. Parallel vs Sequential Batching
Don’t write all chunks in parallel (can overload IndexedDB).
Prefer: writing chunks sequentially, awaiting each one before starting the next.
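A minimal sketch of the sequential pattern, assuming the db.items table and a records array:

```javascript
// Sequential: each chunk finishes before the next begins, keeping memory and load steady.
for (let i = 0; i < records.length; i += 1000) {
  await db.items.bulkPut(records.slice(i, i + 1000));
}
```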
Avoid: launching every chunk at once (e.g., with Promise.all).
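For contrast, the anti-pattern, with the same assumptions:

```javascript
// Parallel: every chunk is queued at once, which can spike memory and overload IndexedDB.
const chunks = [];
for (let i = 0; i < records.length; i += 1000) {
  chunks.push(records.slice(i, i + 1000));
}
await Promise.all(chunks.map((chunk) => db.items.bulkPut(chunk)));
```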
Summary Table
Task → Recommended approach
Bulk insert or upsert → bulkPut() (Dexie)
Bulk in-place update → modify()